Curse-of-dimensionality revisited: Collapse of importance sampling in very large scale systems

Authors

  • Bo Li
  • Thomas Bengtsson
  • Peter Bickel
Abstract

It has been widely realized that Monte Carlo methods (approximation via a sample ensemble) may fail in large scale systems. This work offers some theoretical insight into this phenomenon. In the context of a particle filter (and of importance samplers more generally), we demonstrate that the maximum of the weights associated with the sample ensemble members converges to one as both the sample size and the system dimension tend to infinity. Under fairly weak assumptions, this convergence is shown to hold both for a Gaussian case and for a more general case with iid kernels. Similar singularity behavior is also shown to hold for non-Gaussian, spherically symmetric kernels (e.g. the multivariate Cauchy distribution). In addition, in certain large scale settings, we show that the importance-sampling estimator of an expectation converges weakly to a law rather than to the target constant. Our work is presented and discussed in the context of atmospheric data assimilation for numerical weather prediction.
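To make the collapse concrete, here is a minimal simulation sketch (not the authors' code) of the Gaussian case described above: particles are drawn from a standard Gaussian proposal in dimension d, weighted by a standard Gaussian likelihood at an assumed observation y = 0, and the largest normalized weight is recorded while the sample size stays fixed at N = 1000. The proposal, observation, and sample size are illustrative choices, not values from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def max_normalized_weight(n_particles: int, dim: int) -> float:
        """Largest normalized importance weight for a Gaussian toy model:
        particles ~ N(0, I_dim), weighted by the likelihood of y = 0."""
        x = rng.standard_normal((n_particles, dim))
        log_w = -0.5 * np.sum(x**2, axis=1)   # Gaussian log-likelihood of y = 0
        log_w -= log_w.max()                  # stabilize before exponentiating
        w = np.exp(log_w)
        return float((w / w.sum()).max())

    for dim in (1, 10, 100, 1000):
        print(dim, max_normalized_weight(n_particles=1000, dim=dim))

In a typical run the printed maximum weight climbs toward one as dim grows and is essentially one by dim = 1000, i.e. a single particle carries nearly all of the ensemble's weight.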

Similar articles

Curse-of-dimensionality revisited: Collapse of the particle filter in very large scale systems

It has been widely realized that Monte Carlo methods (approximation via a sample ensemble) may fail in large scale systems. This work offers some theoretical insight into this phenomenon in the context of the particle filter. We demonstrate that the maximum of the weights associated with the sample ensemble converges to one as both the sample size and the system dimension tend to infinity. Spe...

Quick Training of Probabilistic Neural Nets by Importance Sampling

Our previous work on statistical language modeling introduced the use of probabilistic feedforward neural networks to help deal with the curse of dimensionality. Training this model by maximum likelihood, however, requires as many network passes per example as there are words in the vocabulary. Inspired by the contrastive divergence model, we propose and evaluate sampling-based...
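The sketch below is not the method from that paper; it only illustrates, under simplified assumptions, the general idea of replacing a full pass over the vocabulary with importance sampling: the expensive term in the maximum-likelihood gradient is an expectation under the model's softmax distribution, and that expectation (here of the score itself, as a stand-in for the per-word gradient) is estimated from a small number of words drawn from a cheap proposal. The scores, the unigram proposal, and the sample count k are made-up placeholders.

    import numpy as np

    rng = np.random.default_rng(1)
    V, k = 50_000, 64                  # vocabulary size, number of sampled words
    scores = rng.standard_normal(V)    # stand-in model scores s(w) for one context
    unigram = rng.random(V)
    unigram /= unigram.sum()           # cheap proposal distribution q(w)

    # Exact expectation under the softmax: requires scoring every word.
    p = np.exp(scores - scores.max())
    p /= p.sum()
    exact = float(p @ scores)

    # Self-normalized importance-sampling estimate from only k proposal draws.
    idx = rng.choice(V, size=k, p=unigram)
    w = np.exp(scores[idx]) / unigram[idx]   # unnormalized weights exp(s(w)) / q(w)
    w /= w.sum()
    approx = float(w @ scores[idx])

    print(f"exact expectation {exact:.3f}  vs  sampled estimate {approx:.3f}")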

How to avoid the curse of dimensionality: scalability of particle filters with and without importance weights

Particle filters are a popular and flexible family of numerical algorithms for solving a large class of nonlinear filtering problems. However, standard particle filters with importance weights have been shown to require a sample size that increases exponentially with the dimension D of the state space in order to achieve a certain performance, which precludes their use in very high-dimensional filt...
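As a rough illustration of that exponential scaling (an assumption-laden sketch, not the cited analysis): take a standard Gaussian proposal in dimension d, a target shifted by 0.5 in every coordinate, measure weight degeneracy by the effective sample size ESS = 1 / Σ w_i² of the normalized weights, and double the sample size until the ESS clears a fixed floor. The shift of 0.5 and the ESS floor of 100 are arbitrary choices standing in for the "certain performance" requirement.

    import numpy as np

    rng = np.random.default_rng(2)
    SHIFT = 0.5        # per-coordinate mean shift between proposal and target (assumed)
    ESS_FLOOR = 100.0  # minimum acceptable effective sample size (arbitrary)

    def effective_sample_size(n: int, dim: int) -> float:
        x = rng.standard_normal((n, dim))                     # proposal draws
        log_w = SHIFT * x.sum(axis=1) - dim * SHIFT**2 / 2.0  # log target/proposal ratio
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        return float(1.0 / np.sum(w**2))

    for dim in (4, 8, 16, 24):
        n = 200
        while effective_sample_size(n, dim) < ESS_FLOOR:
            n *= 2
        print(f"d = {dim:2d}: sample size needed ~ {n}")

In a typical run the required sample size grows roughly like exp(c·d), which is the qualitative behavior the excerpt refers to.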

Deterministic Importance Sampling with Error Diffusion

This paper proposes a deterministic importance sampling algorithm based on the recognition that delta-sigma modulation is equivalent to importance sampling. We propose a generalization of delta-sigma modulation to arbitrary dimensions, addressing the curse of dimensionality as well. Unlike previous sampling techniques that transform low-discrepancy and highly stratified samples in t...

Large-scale games in large-scale systems

Many real-world problems modeled by stochastic games have huge state and/or action spaces, leading to the well-known curse of dimensionality. The complexity of the analysis of large-scale systems is dramatically reduced by exploiting mean field limit and dynamical system viewpoints. Under regularity assumptions and specific time-scaling techniques, the evolution of the mean field limit can be e...

Journal:

Volume   Issue

Pages  -

Publication date: 2005